An introduction to patter

Edward Lavender

Eawag, Swiss Federal Institute for Aquatic Science and Technology

Introduction

Introduction

https://github.com/edwardlavender/patter

Introduction

  • patter is an R package in the movement-ecology ecosystem
  • patter was motivated by our research in passive acoustic telemetry systems
  • patter fits state-space models to animal-tracking data, using particle filtering and smoothing algorithms

Team

Institutions

Workshop

  • The goal today is to introduce patter
  • We will focus on how to use the package
  • There will be:
    • Interactive presentation & questions
    • (Time allowing) Optional practical activity (using example datasets or your own)

Context

  • There are many packages for animal tracking (Joo et al. 2020)

  • Numerous packages are available for passive acoustic telemetry (VTrack, glatos, RSP, etc.)

  • patter is a unique addition that fits SSMs to animal-tracking data using particle filtering and smoothing algorithms

  • patter can incorporate a wide range of data types

Important

For passive acoustic telemetry, patter is the only package that reconstructs individual movements and patterns of space use within a coherent probabilistic framework.

Context

  • patter uses a dedicated high-performance backend written in Julia

https://github.com/edwardlavender/Patter.jl

Evolution

  • patter evolved from flapper (Lavender et al., 2023) https://github.com/edwardlavender/flapper

Evolution

  • flapper is a more general-purpose package for passive acoustic telemetry that supports:
    • data acquisition, simulation, assembly and processing
    • spatial operations and distance metrics (e.g., fast C++ LCP algorithms)
    • data exploration (detection statistics, movement metrics)
    • ‘flapper’ algorithms

Evolution

  • patter focuses specifically on model inference for SSMs using particle algorithms

https://github.com/edwardlavender/patter

Methodology

Methodology

  • For the methodology, see:

Lavender, E. et al. (2024a). Particle algorithms for animal movement modelling in autonomous receiver networks. bioRxiv 2024.09.16.613223. https://doi.org/10.1101/2024.09.16.613223

  • For a succinct mathematical summary, see the package paper:

Lavender, E. et al. (2024b). patter: particle algorithms for animal tracking in R and Julia. bioRxiv 2024.07.30.605733. https://doi.org/10.1101/2024.07.30.605733

  • For a comprehensive algorithmic explanation for mathematical and non-mathematical readers:

Lavender, E. et al. (2024a). Supporting Information.

  • For an intuitive summary, see:

vignette("a-methodology", package = "patter")

Set up

Operating systems

  • patter v.1.0.0 supports Windows and MacOS
  • patter@dev supports Linux and will be rolled out soon

Note

There are some special considerations for running patter on Linux that we won’t discuss today. Please get in touch if you are running patter on Linux HPC environments.

Installation

  • To install patter, follow the instructions on https://github.com/edwardlavender/patter
    • Install R and Julia
    • On Windows, install RTools
    • On MacOS, install Xcode (and configure .Makevars as required)
    • Install patter (and let it handle Julia dependencies)

Project set up

  1. (optional) Set up an RStudio project
  2. (optional) Initiate local-dependency management with renv
  3. (optional) Build project directories (data/, data-raw/, R/, Julia/, …)
  4. Install patter
  5. (optional) Set Julia options (next slide)
  6. Connect to Julia

Note

renv is an R package that implements local-dependency management. In Julia, we get this ‘for free’. In patter, set JULIA_PROJ to use a local package environment for the Julia dependencies (recommended).

Project set up: JULIA options

  • JULIA_HOME is the location of the Julia installation

    • Usually, JULIA_HOME is found automatically, but you may need to set it if Julia is installed in a non-standard location
    • E.g., on an Intel Mac JULIA_HOME may be something like /Applications/Julia-1.10.app/Contents/Resources/julia/bin/
  • (optional) JULIA_PROJ is the location of the Julia project (./Julia/)

  • (optional) JULIA_NUM_THREADS is the number of threads.

Note

On Windows, JULIA_NUM_THREADS must be set system-wide as an environment variable. Follow the guidance here: https://github.com/edwardlavender/patter/issues/13.

Project set up: JULIA options

  • It is a good idea to set JULIA options globally via .Rprofile (or .Renviron):
# Set JULIA_HOME
# * This is usually not required

# (optional) Set JULIA_PROJ to ./Julia/
if (!requireNamespace("here", quietly = TRUE)) {
  install.packages("here")
}
Sys.setenv(JULIA_PROJ = here::here("Julia"))

# (optional) Set JULIA_NUM_THREADS
# * This works on MacOS and Linux 
Sys.setenv(JULIA_NUM_THREADS = parallel::detectCores())

# (optional) Set additional options 
Sys.setenv(OMP_NUM_THREADS = parallel::detectCores())

Connect to Julia

  • After library(patter), run julia_connect() to connect R to Julia. This:
    1. Sets up JuliaCall (which handles the R-Julia coupling)
    2. Validates the Julia installation (at JULIA_HOME)
    3. (Optional) Activates a local Julia project (JULIA_PROJ)
    4. Installs and pre-compiles Julia dependencies (e.g., Patter.jl)
  • In general, you should run julia_connect() once per R session.

Note

The first call to julia_connect() may take several minutes (or longer) as Julia packages are installed and pre-compiled. Subsequent calls are faster.

Connect to Julia

  • The syntax for julia_connect() is:
julia_connect(..., JULIA_PROJ, .threads, ...)
  • In patter@dev, this is:
julia_connect(JULIA_HOME, JULIA_PROJ, JULIA_NUM_THREADS, ...)
  • If you’ve set environment variables (recommended), just use:
julia_connect()

Overview

Overview

The patter workflow.

Overview

  • We will work through:
    • Boilerplate setup
    • An example workflow with real data
    • An example simulation-based workflow

Note

For a summary of the workflow, see vignette("b-workflow-outline", package = "patter"). See the README for an example workflow. patter functions are well documented. Please read them carefully and follow the links to the Patter.jl documentation where directed. Help us to improve the documentation by submitting GitHub issues.

Boilerplate setup

Boilerplate setup

  • Load and attach patter and supporting packages:
# Load & attach patter
library(patter)
# Ancillary packages
library(data.table)
library(dtplyr)
library(dplyr, warn.conflicts = FALSE)
library(glue)
library(JuliaCall)
library(spatstat.explore)
library(truncdist)
  • Use patter.verbose to control user output messages:
# (patter.verbose = TRUE by default)
op <- options(patter.verbose = FALSE)
  • See ?patter-progress for additional options

Boilerplate setup

  • Connect to Julia:
julia_connect()
  • Set the seed in R and Julia via set_seed():
set_seed(2024 - 10 - 18)

An example workflow with real data

Study system

  • The first part of this presentation walks through how to reconstruct movements and patterns of space use using example data collected from flapper skate (Dipturus intermedius) in Scotland.

The Loch Sunart to the Sound of Jura Marine Protected Area.

Study system

Observations include detections at receivers:

head(dat_acoustics)
   individual_id           timestamp receiver_id
           <int>              <POSc>       <int>
1:            25 2016-03-17 01:50:00          26
2:            25 2016-03-17 01:52:00          26
3:            25 2016-03-17 01:54:00          26
4:            25 2016-03-17 01:58:00          26
5:            25 2016-03-17 02:00:00          26
6:            25 2016-03-17 02:04:00          26

Note

dat_acoustics will soon be renamed dat_detections to avoid confusion between detection data (which comprise detections alone) and acoustic data (which comprise detections and non-detections, which are implicitly known for each time step).

Study system

This is the receiver metadata:

head(dat_moorings)
   receiver_id receiver_start receiver_end receiver_x receiver_y receiver_alpha
         <int>         <POSc>       <POSc>      <num>      <num>          <num>
1:           3     2016-03-03   2016-11-02   706442.1    6254007              4
2:           4     2016-03-03   2017-03-08   709742.1    6267707              4
3:           7     2016-03-03   2016-11-26   708742.1    6269107              4
4:           9     2016-03-03   2016-10-13   706042.1    6254307              4
5:          11     2016-03-03   2016-11-18   707542.1    6267707              4
6:          12     2016-03-03   2017-03-08   710042.1    6267307              4
   receiver_beta receiver_gamma
           <num>          <num>
1:         -0.01            750
2:         -0.01            750
3:         -0.01            750
4:         -0.01            750
5:         -0.01            750
6:         -0.01            750

Tip

To incorporate passive acoustic telemetry datasets in patter, follow the format for dat_acoustics and dat_moorings. There is a pat_setup_data() function that you can use to check these datasets. The receiver_alpha, receiver_beta and receiver_gamma columns optionally define receiver-specific parameters of a truncated, distance-decaying logistic detection probability model but (as we will see later) we can use any observational model of our choosing.

Study system

We also have archival (depth) observations, collected at a two-minute resolution:

head(dat_archival)
   individual_id           timestamp  depth
           <num>              <POSc>  <num>
1:            25 2016-03-16 00:00:00 159.15
2:            25 2016-03-16 00:02:00 161.01
3:            25 2016-03-16 00:04:00 162.86
4:            25 2016-03-16 00:06:00 164.48
5:            25 2016-03-16 00:08:00 158.46
6:            25 2016-03-16 00:10:00 155.45

Study system

  • Our goal in the first part of this presentation is to reconstruct movements and patterns of space use for an example individual given these data.
# Define detections
det <-
  dat_acoustics |>
  filter(individual_id == 25L) |>
  mutate(individual_id = NULL) |>
  as.data.table()
# Define archival (depth) observations
arc <-
  dat_archival |>
  filter(individual_id == 25L) |>
  mutate(individual_id = NULL) |>
  as.data.table()

Study system

plot(arc$timestamp, arc$depth * -1, 
     col = "royalblue", type = "l",
     ylim = c(-max(arc$depth), 0), 
     xlab = "Time stamp", ylab = "Depth (m)")
points(det$timestamp, rep(0, nrow(det)))

Study system

  • We need to define a map of our study system in R and Julia

Note

We use a coarse raster for convenience only. For a real analysis, a better map would be desirable.

Study system

Study system

  • The map is a raster that sets the spatial domain of our analyses and defines the region(s) within which movements are possible
# Read SpatRaster
map <- dat_gebco()

# dat_gebco() is short-hand for the following:
# spat.tif <- system.file("extdata", "dat_gebco.tif", package = "patter", mustWork = TRUE)
# map      <- terra::rast(spat.tif)

# Export SpatRaster as `env` to Julia
set_map(map)

Note

In patter v.1.0.0, in the particle filtering routines, initial particles are sampled in R from a SpatRaster, which can differ from the map (env) in Julia. In patter@dev, initial particles are sampled in Julia and set_map() includes .as_Raster and .as_GeoArrays arguments if you need to distinguish the two maps. Use .as_Raster = TRUE to define the initial map from which particles are sampled. Use .as_GeoArrays = TRUE to define the map for the movement model.

Study system

  • Cell entries should be floats
  • Code inhospitable habitats (e.g., land) as NA_real_
  • A planar (UTM) coordinate reference system is required

Note

The planar raster format is used for computational efficiency. In our applications, the map is often a bathymetry SpatRaster. The next version of patter should include routines to facilitate the creation of a SpatRaster if you have a shapefile (or similar). For now, see terra::rasterize() to convert a shapefile to a SpatRaster. Raster resolution can be as high as desired, provided the data fit in memory. There is no (or minimal) speed penalty for higher-resolution rasters.
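As an illustrative sketch of these requirements, starting from a raw GEBCO-style elevation raster (the file name and EPSG code here are placeholders, not part of the example data):

```r
# Hypothetical map preparation (file name and EPSG code are placeholders)
library(terra)
map <- rast("gebco.tif")           # read raw elevation data (negative at sea)
map <- project(map, "EPSG:32629")  # re-project to a planar (UTM) CRS
map[map >= 0] <- NA_real_          # code land (elevation >= 0 m) as NA
map <- -1.0 * map                  # convert elevation to (positive) depth
set_map(map)                       # export to Julia as `env`
```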

Study system

  • Define the study timeline:
    • This sets the period and resolution at which we represent animal movement
    • We may have observations regularly or irregularly along this timeline

Tip

The appropriate resolution of the timeline is study specific. Considerations include the study species’ movement capacity, the complexity of boundaries (e.g., islands), the resolution at which data were collected and computational resources. In passive acoustic telemetry studies, a reasonable starting point is to (approximately) align the resolution of the timeline with the transmission interval (≈2 mins).

Study system

  • Use assemble_timeline() to set a timeline based on the input datasets:
timeline <- assemble_timeline(.datasets = list(det, arc), 
                              .step = "2 mins", 
                              .trim = TRUE)
str(timeline)
 POSIXct[1:24225], format: "2016-03-17 01:50:00" "2016-03-17 01:52:00" "2016-03-17 01:54:00" ...

Study system

  • Alternatively, set the timeline manually using seq.POSIXt():
timeline <- seq(as.POSIXct("2016-03-17 01:50:00", tz = "UTC"),
                as.POSIXct("2016-03-18 01:48:00", tz = "UTC"), 
                by = "2 mins")
str(timeline)
 POSIXct[1:720], format: "2016-03-17 01:50:00" "2016-03-17 01:52:00" "2016-03-17 01:54:00" ...

State

  • State is an Abstract Type in Patter.jl
  • This is an object that holds the parameters describing the ‘state’ of an individual
  • Typically, state means ‘location’ (in two or three dimensions), but additional parameters may be included
# The Julia code for the StateXY struct
struct StateXY <: State
    map_value::Float64
    x::Float64
    y::Float64
end 

State

  • In Patter.jl, the following sub-types are built-in:

    • StateXY(map_value, x, y): Used for two-dimensional (x, y) states;
    • StateXYZD(map_value, x, y, z, angle): Used for four-dimensional (x, y, z, direction) states;

Tip

It is straightforward to define new state types; see the help for ?State (or submit a GitHub issue).

State

  • In patter, you just need to specify the state sub-type as a character string:
    • "StateXY" maps onto StateXY in Julia
    • "StateXYZD" maps onto StateXYZD in Julia
state <- "StateXY"
  • Each state-type maps onto a corresponding movement model

Movement model

  • ModelMove is an Abstract Type in Patter.jl
  • This is an object that holds the dimensions of the movement model, e.g.,
    • The map
    • A distribution of step lengths
    • A distribution of turning angles
# The Julia code for the ModelMoveXY struct
struct ModelMoveXY{T, U, V} <: ModelMove
    map::T
    dbn_length::U
    dbn_angle::V
end 

Movement model

  • The movement model can be instantiated via an R move_*() wrapper
    • StateXY maps onto ModelMoveXY, which is wrapped by move_xy()
    • StateXYZD maps onto ModelMoveXYZD, which is wrapped by move_xyzd()
  • The fields of the movement model in the move_*() wrapper must be specified using Julia code

Movement model

  • Define a two-dimensional random walk (x, y):
# Define maximum moveable speed between two time steps
mobility   <- 750.0

# Define 2D RW (XY)
model_move <- move_xy(dbn_length = glue("truncated(Gamma(1.0, 250.0), upper = {mobility})"),
                      dbn_angle = "Uniform(-pi, pi)")

Tip

In Julia, a number without a decimal point is an integer, which may occasionally cause trouble if a float is required. Specify floats by adding .0 (e.g., 750.0).

Movement model

  • move_*() wrappers return a string of Julia code:
model_move
ModelMoveXY(env, truncated(Gamma(1.0, 250.0), upper = 750), Uniform(-pi, pi));

Tip

To check the parameterisation of other distributions, such as the Normal distribution in Julia, use JuliaCall::julia_help("Normal"). For new types of movement model, see the help for ?State (or submit a GitHub issue).

Note

In patter@dev, ModelMove structures and move_*() wrappers contain a mobility field.

Movement model

  • Visualise the components of our movement model:
expr <- expression({
  pp <- par(mfrow = c(1, 2))
  curve(dtrunc(x, spec = "gamma", a = 0, b = mobility, shape = 1.0, scale = 250.0), 
        from = 0, to = mobility, n = 1e3L,
        xlab = "Step length (m)", ylab = "Density")
  curve(dunif(x, -pi, pi),
        from = -pi - 0.1, to = pi + 0.1, n = 1e3L,
        xlab = "Heading (rad)", ylab = "Density")
  par(pp)
})

Movement model

  • Visualise the components of our movement model:

Movement model

  • Visualise realisations of the movement model:
paths <- sim_path_walk(.map = map, 
                       .timeline = timeline,
                       .state = state,
                       .model_move = model_move, 
                       .n_path = 2L, .one_page = TRUE)

Tip

sim_path_walk() can be used to simulate new trajectories or visualise a prior. The simulation is much faster and more flexible than many alternative routines (e.g., glatos::crw_in_poly()).

Important

In the particle filter, the movement model is a prior. It is a good idea to visualise realisations of the prior. The simulated trajectories should look sensible.

Movement model

  • Visualise realisations of the movement model:

Movement model

  • Visualise realisations of the movement model:
head(paths)
   path_id timestep           timestamp map_value        x       y
     <int>    <int>              <POSc>     <num>    <num>   <num>
1:       1        1 2016-03-17 01:50:00  29.49762 711942.1 6252507
2:       1        2 2016-03-17 01:52:00  29.79512 711958.2 6252289
3:       1        3 2016-03-17 01:54:00  29.79512 711894.8 6252344
4:       1        4 2016-03-17 01:56:00  28.81217 711897.6 6252587
5:       1        5 2016-03-17 01:58:00  28.40617 711864.7 6252596
6:       1        6 2016-03-17 02:00:00  29.17501 712094.4 6251995

Tip

Patter.jl automatically truncates the movement model by barriers (e.g., land); that is, we sample movements from a restricted movement model (in the filter) and we evaluate the probability density of truncated moves (in the smoother). With custom movement models, you need to specify a Patter.simulate_step() method (for the filter) and a Patter.logpdf_step() method (for the smoother). These methods simulate and evaluate the log-probability of an unrestricted step. Patter.jl handles the truncation for you (via Patter.simulate_move() and Patter.logpdf_move()).

Movement model

  • More sophisticated movement models are possible, including:
    • Random walks in two or three dimensions
    • Correlated random walks
    • Behavioural-switching models
  • See the help for ?State to get started

Movement model

  • This is an example of a more sophisticated movement model for flapper skate:

Example behavioural-switching model.

Movement model

  • There is a trade-off between simplicity and complexity
  • Static parameters are assumed known
  • Where parameters are unknown, you can:
    1. See Lavender et al. (2024a) for guidance on parameter sensitivity
    2. Run a sensitivity analysis:
      • Use simulations
      • Compare results from real-world analyses
    3. Formally propagate uncertainty with multiple algorithm runs
    4. Data-driven parameter estimation (not currently implemented)
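A sensitivity analysis of this kind might be sketched as follows (illustrative only; the candidate mobility values and the comparison step are placeholders):

```r
# Re-fit with alternative mobility values and compare the outputs
mobilities <- c(500.0, 750.0, 1000.0)
fits <- lapply(mobilities, function(m) {
  move <- move_xy(dbn_length = glue("truncated(Gamma(1.0, 250.0), upper = {m})"),
                  dbn_angle  = "Uniform(-pi, pi)")
  # ... re-run pf_filter() with `move` and summarise patterns of space use
})
```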

Observation model(s)

  • We have defined the movement model
  • This encodes the process of animal movement
  • We now need to define the observation models
  • These encode the probability of the observations, given the latent (unobserved) location of the animal

Observation model(s)

  • As a reminder, we have collected detections and depth time series:
head(det, 2)
             timestamp receiver_id
                <POSc>       <int>
1: 2016-03-17 01:50:00          26
2: 2016-03-17 01:52:00          26
head(arc, 2)
             timestamp  depth
                <POSc>  <num>
1: 2016-03-16 00:00:00 159.15
2: 2016-03-16 00:02:00 161.01

Observation model(s)

  • Each dataset should comprise a time series of observations for one individual
  • Datasets should be pre-processed as required: e.g.,
    • Check receiver locations are valid on the map
    • Identify false detections
  • patter accepts any kind of observational dataset

Note

For passive acoustic telemetry and/or archival data, the pat_setup_data() function is available to validate the format of these data types for use in patter.
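For instance, a quick (hypothetical) check that receiver locations fall on valid, non-NA cells of the map might look like:

```r
# Check receiver locations against the map (NA cells are inhospitable)
rxy    <- cbind(dat_moorings$receiver_x, dat_moorings$receiver_y)
on_map <- terra::extract(map, rxy)[, 1]
if (any(is.na(on_map))) {
  warning("Some receivers fall on NA (e.g., land) cells of the map!")
}
```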

Observation model(s)

  • Each dataset should be provided as a data.table with the following columns:
    • sensor_id, timestamp, obs;
    • Additional columns that define observation-model parameters;

Tip

See assemble_acoustics() for full details. Note that the resolution of the time stamps must match the resolution of the timeline (but irregular observations are fine).
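If your time stamps do not already align with the timeline, one option is to round them (a sketch, assuming the lubridate package; check that rounding is appropriate for your data):

```r
# Round archival time stamps to the resolution of the (2-min) timeline
arc[, timestamp := lubridate::round_date(timestamp, "2 mins")]
# Confirm that all observations now fall on the timeline
stopifnot(all(arc$timestamp %in% timeline))
```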

Observation model(s)

  • In passive acoustic telemetry datasets, we record detections but the observations comprise detections and non-detections
  • Use the assemble_acoustics() helper to assemble the acoustic (detection, non-detection) time series:
acc <- assemble_acoustics(.timeline  = timeline,
                          .acoustics = det,
                          .moorings  = dat_moorings, 
                          .services  = NULL)

Note

The .acoustics argument will soon be replaced with .detections for the reasons mentioned previously.

Observation model(s)

head(acc)
             timestamp sensor_id   obs receiver_x receiver_y receiver_alpha
                <POSc>     <int> <int>      <num>      <num>          <num>
1: 2016-03-17 01:50:00         3     0   706442.1    6254007              4
2: 2016-03-17 01:50:00         4     0   709742.1    6267707              4
3: 2016-03-17 01:50:00         7     0   708742.1    6269107              4
4: 2016-03-17 01:50:00         9     0   706042.1    6254307              4
5: 2016-03-17 01:50:00        11     0   707542.1    6267707              4
6: 2016-03-17 01:50:00        12     0   710042.1    6267307              4
   receiver_beta receiver_gamma
           <num>          <num>
1:         -0.01            750
2:         -0.01            750
3:         -0.01            750
4:         -0.01            750
5:         -0.01            750
6:         -0.01            750

Note

assemble_acoustics() accounts for variable receiver deployment periods and servicing events. Duplicate observations (that is, detections at the same receiver in the same time step) are dropped. If available in .moorings, additional columns (receiver_alpha, receiver_beta and receiver_gamma) are included as required for the default acoustic observation model (ModelObsAcousticLogisTrunc). But you have the flexibility to define receiver- and time-varying parameters by modifying the data.table as required.

Observation model(s)

  • If you have passive acoustic telemetry observations, you should also use acoustic containers
  • The syntax is:
# ModelObsAcousticContainer
containers <- assemble_acoustics_containers(.acoustics, .mobility, ...)

Note

Acoustic containers will soon be available in patter@dev. They enhance the efficiency of the particle filter (reducing the number of particles required to achieve convergence).

Observation model(s)

  • We have to assemble other datasets with the required columns (sensor_id, timestamp, obs, ...) manually
head(arc, 2)
             timestamp  depth
                <POSc>  <num>
1: 2016-03-16 00:00:00 159.15
2: 2016-03-16 00:02:00 161.01
arc <- 
  arc |> 
  # Define 'dummy' sensor_id
  mutate(sensor_id = 1L) |> 
  # Observations must be in the 'obs' column
  rename(obs = "depth") |> 
  # Select columns
  select("sensor_id", "timestamp", "obs") |>
  # Add observation model parameters 
  mutate(depth_sigma = 50, depth_deep_eps = 20) |>
  as.data.table()

Observation model(s)

  • Even so, it is a good idea to pass other datasets through the assemble_archival() function for processing and checks:
arc <- assemble_archival(.timeline = timeline, .archival = arc)
head(arc)
   sensor_id           timestamp   obs depth_sigma depth_deep_eps
       <int>              <POSc> <num>       <num>          <num>
1:         1 2016-03-17 01:50:00 73.78          50             20
2:         1 2016-03-17 01:52:00 73.32          50             20
3:         1 2016-03-17 01:54:00 73.32          50             20
4:         1 2016-03-17 01:56:00 73.32          50             20
5:         1 2016-03-17 01:58:00 73.55          50             20
6:         1 2016-03-17 02:00:00 68.70          50             20

Observation model(s)

  • For each dataset, we need to define the corresponding ModelObs sub-type
  • A ModelObs sub-type is a structure that holds the parameters of an observation model
  • ModelObs sub-types are instantiated in Julia using the data.table(s) assembled above
  • We use ModelObs instances to simulate observations or to evaluate the log-probability of observations, given latent location

Observation model(s)

  • The following ModelObs structures are built-in to Patter.jl:
    • ModelObsAcousticLogisTrunc
    • ModelObsDepthNormalTrunc
    • ModelObsDepthUniform

Tip

It is straightforward to define new ModelObs sub-types for new data types; see ?ModelObs (or submit a GitHub issue).

Observation model(s)

  • ModelObsAcousticLogisTrunc is designed for acoustic observations
  • The structure holds the parameters of a truncated logistic detection probability model
  • This contains the fields we need to simulate an observation, or evaluate the log-probability of an observation, given a transmission from a particular location
# Julia code for the ModelObsAcousticLogisTrunc sub-type
struct ModelObsAcousticLogisTrunc <: ModelObs
    sensor_id::Int64
    receiver_x::Float64
    receiver_y::Float64
    receiver_alpha::Float64
    receiver_beta::Float64    
    receiver_gamma::Float64
end

Observation model(s)

  • In a truncated logistic detection probability model:
    • receiver_alpha and receiver_beta affect the rate of decline in detection probability with distance from a receiver
    • receiver_gamma is the maximum detection range

Observation model(s)

Example truncated logistic detection probability models

Observation model(s)

  • You can explore different parameter values using this code:
expr <- expression({
  # Define parameters 
  receiver_alpha <- 4
  receiver_beta  <- -0.01
  receiver_gamma <- 1000
  # Visualise model 
  curve(ifelse(x <= receiver_gamma, 
               1 / (1 + exp(-(receiver_alpha + receiver_beta * x))), 
               0), 
        from = 0, to = receiver_gamma, 
        xlab = "Distance (m)", ylab = "Detection probability")
})

Observation model(s)

eval(expr)

Observation model(s)

  • ModelObsDepthNormalTrunc is a ModelObs structure for a depth observation and a truncated normal model
  • This is a suitable depth-observation model for benthic/demersal species
  • The individual is assumed to lie within an envelope around the bathymetric depth, defined by a normal distribution centred on the bathymetric depth (with standard deviation depth_sigma), truncated at 0 m and at depth_deep_eps m below the seabed
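The implied depth envelope can be visualised with dtrunc() from truncdist (attached in the boilerplate); the bathymetric depth and parameter values below are illustrative:

```r
# Density of a depth observation, given a latent location where the
# bathymetric depth is 100 m (truncated normal model)
bathy          <- 100.0
depth_sigma    <- 50.0
depth_deep_eps <- 20.0
curve(dtrunc(x, spec = "norm", a = 0, b = bathy + depth_deep_eps,
             mean = bathy, sd = depth_sigma),
      from = 0, to = bathy + depth_deep_eps, n = 1e3L,
      xlab = "Depth observation (m)", ylab = "Density")
```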

Observation model(s)

Example truncated normal depth models

Observation model(s)

  • ModelObsDepthUniform is a ModelObs structure for a depth observation and a uniform depth model
  • This model assumes that an individual must be located in an envelope around the bathymetric depth, defined by two error terms (depth_shallow_eps and depth_deep_eps)
  • This model can be tailored for benthic or pelagic species

Observation model(s)

  • For the particle filter, multiple sets of observations should be provided as a named list
  • List names define the model sub-types
yobs <- list(ModelObsAcousticLogisTrunc = acc, 
             ModelObsDepthNormalTrunc = arc)

Note

In Julia, the list of observations is translated into a hash table of ModelObs sub-types.

Observation model(s)

  • As for the movement model, the parameters of the observation model(s) are assumed known

Particle filter

  • We have now defined:
    1. Study area map
    2. Study timeline
    3. State
    4. Movement model
    5. Observations/observation model parameters/observation model sub-types
  • We are now in a position to run the particle filter!

Particle filter

  • At \(t = 1\):

    1. Sample initial particles (locations → States), via simulate_states_init()
    2. For each simulated particle, compute the (log) weight, via Patter.logpdf_obs()
  • For \(t \in \{2, ..., T\}\):

    1. Simulate animal movement, generating new particles (locations → States), via Patter.simulate_step()
    2. For each simulated particle, compute the (log) weight, via Patter.logpdf_obs()
    3. Periodically, resample particles with replacement
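The logic of these steps can be illustrated with a toy, one-dimensional bootstrap particle filter in base R (purely illustrative; this is not patter's implementation):

```r
# Toy 1D bootstrap particle filter (illustrative only)
set.seed(1)
nt <- 100L; np <- 1000L
truth <- cumsum(rnorm(nt))                 # latent 1D 'path'
yobs  <- truth + rnorm(nt, sd = 0.5)       # noisy observations
x <- rnorm(np, mean = 0, sd = 5)           # 1. sample initial particles
for (t in seq_len(nt)) {
  if (t > 1L) x <- x + rnorm(np)           # simulate 'movement'
  w <- dnorm(yobs[t], mean = x, sd = 0.5)  # compute particle weights
  x <- sample(x, np, replace = TRUE, prob = w)  # resample
}
```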

Tip

For custom State, ModelMove and/or ModelObs structures, associated methods for corresponding Julia routines are required (see ?State, ?ModelMove and ?ModelObs for guidance).

Particle filter

  • The particle filter is implemented via pf_filter() with the following arguments:
pf_filter(
  .map,
  .timeline,
  .state = "StateXY",
  .xinit = NULL,
  .xinit_pars = list(),
  .yobs = list(),
  .model_obs,
  .model_move = move_xy(),
  .n_move = 100000L,
  .n_particle = 1000L,
  .n_resample = as.numeric(.n_particle),
  .n_record = 1000L,
  .direction = c("forward", "backward"),
  .verbose = getOption("patter.verbose")
)

Note

In patter@dev, .map, .xinit_pars and .model_obs are no longer required.

Particle filter

Argument Description
.map The map
.timeline The timeline
.state The state
.xinit, .xinit_pars Arguments to set initial location
.yobs, .model_obs The named list of observations (and a character vector of corresponding ModelObs sub-types)
.n_move Set to default
.n_particle The number of particles
.n_resample Set to default
.n_record Set to default
.direction The direction of the particle filter
.verbose Show user output

Particle filter

  • To set initial states:
    1. Specify known states via .xinit (or .map)
    2. Let patter sample states automatically from part of the .map that is compatible with the initial observations
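For example, if the release location is known, .xinit might be assembled as a one-row data.table of initial states (a hypothetical sketch for "StateXY"; the coordinates are illustrative and the required columns should be checked against ?pf_filter):

```r
# Hypothetical initial state at a known release location
x0 <- 709000; y0 <- 6253000
xinit <- data.table(map_value = terra::extract(map, cbind(x0, y0))[, 1],
                    x = x0, y = y0)
```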

Tip

If you know the initial location(s) of the animal, provide this information via .xinit (or .map). If .xinit is not provided, initial locations are sampled uniformly from the part of the .map that is compatible with the initial observations. In patter@dev, this happens in Julia. You can set a unique initial map via set_map(..., .as_Raster = TRUE, .as_GeoArray = FALSE). For custom ModelObs subtypes, map_init methods are required for automated sampling of the part of the map compatible with observations. In patter v.1.0.0, these methods should be provided in R; in patter@dev, methods written in Julia are required.

Particle filter

  • Let’s set up the filter for our example time series:
# List filter arguments
args <- list(.map         = map,
             .timeline    = timeline,
             .state       = state,
             .xinit_pars  = list(mobility = mobility),
             .yobs        = yobs,
             .model_obs   = names(yobs),
             .model_move  = model_move,
             .n_particle  = 1e5L)

Tip

How many particles do we need? Recall that the particles approximate the distribution of an individual’s State at every time step (\(f(\mathbf{s}_t \mid \mathbf{y}_{1:t})\) in the filter). The number of particles required to approximate a distribution depends on its dimensionality. For a two-dimensional distribution, ≈500 particles may be sufficient. In practice, in the filter, a larger number of particles may be required to ensure that enough particles remain ‘alive’ at each time step to approximate the distribution. With acoustic observations alone, relatively few particles are required. (For passive acoustic telemetry data, the ModelObsAcousticContainer structure will soon be included in patter@dev. Use it to achieve convergence with fewer particles than used here (≤ 25,000)). With additional observations, more particles may be required depending on the complexity of the landscape that particles have to explore. This is associated with a (linear) speed penalty.

Particle filter

  • Let’s run the filter!
# Forward run
fwd <- do.call(pf_filter, args, quote = TRUE)

Warning

We are approximating the (partial) marginal distribution of the individual’s location, given the data up to and including that time step, at every time step (every two minutes here!). This is awesome and the Julia code is very fast, but it is a lot to ask: for large numbers of particles, observations and time steps, the filter may take minutes or hours to run (especially on older computers). On macOS, Julia progress bars are shown. On Windows, this is not possible, so run some time trials (e.g., with a shorter timeline) to gauge likely computation time.

Particle filter

  • The filter returns a pf_particles-class object
  • This is a named list:
  class(fwd)
[1] "list"         "pf_particles"
  summary(fwd)
            Length Class      Mode   
xinit       3      data.table list   
states      6      data.table list   
diagnostics 4      data.table list   
convergence 1      -none-     logical

Note

In patter@dev, the xinit element is removed and a trials element is added.

Particle filter

  • states is a data.table of particles:
head(fwd$states)
   path_id timestep           timestamp map_value        x       y
     <int>    <int>              <POSc>     <num>    <num>   <num>
1:       1        1 2016-03-17 01:50:00 113.60658 708742.1 6253207
2:       1        2 2016-03-17 01:52:00  56.53801 709332.3 6253208
3:       1        3 2016-03-17 01:54:00 116.70552 709077.7 6253559
4:       1        4 2016-03-17 01:56:00  83.21526 709087.0 6253182
5:       1        5 2016-03-17 01:58:00  80.03239 709101.3 6253274
6:       1        6 2016-03-17 02:00:00  75.20494 709221.9 6253450

Warning

The path_id is simply an index; it is named path_id for consistency with sim_path_walk(). These are not trajectories.

Particle filter

  • Map particle coordinates:
pf_plot_xy(.map = map, .coord = fwd$states, .steps = 1L, 
           .add_points = list(pch = ".", col = "red"))

Tip

In pf_filter(), at each time step, .n_record particles are sampled with replacement, in line with the weights, and saved in memory. All recorded particles thus have equal weight.
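As a sketch of what ‘sampling in line with the weights’ means (multinomial resampling; a simplification of patter's internals, using made-up states and weights):

```r
# Multinomial resampling: draw particle indices with replacement,
# with probability proportional to the weights; afterwards, every
# retained particle carries equal weight
set.seed(1)
n <- 1000L
x <- rnorm(n)          # particle states (e.g., x coordinates)
w <- runif(n)          # unnormalised weights
w <- w / sum(w)        # normalised weights

idx <- sample.int(n, size = n, replace = TRUE, prob = w)
x_resampled <- x[idx]  # an equally weighted particle set
```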

Particle filter

Particle filter

  • diagnostics is a data.table of filter diagnostics:
fwd$diagnostics
     timestep           timestamp      ess     maxlp
        <int>              <POSc>    <num>     <num>
  1:        1 2016-03-17 01:50:00 37374.70 -4.640798
  2:        2 2016-03-17 01:52:00 54798.57 -4.619861
  3:        3 2016-03-17 01:54:00 60377.26 -4.619686
  4:        4 2016-03-17 01:56:00 48971.43 -4.251760
  5:        5 2016-03-17 01:58:00 46014.97 -4.621204
 ---                                                
716:      716 2016-03-18 01:40:00 81200.61 -4.408220
717:      717 2016-03-18 01:42:00 85432.21 -4.408251
718:      718 2016-03-18 01:44:00 88933.96 -4.408215
719:      719 2016-03-18 01:46:00 88393.27 -4.408220
720:      720 2016-03-18 01:48:00 88811.31 -4.408224

Particle filter

  • maxlp is the maximum log-posterior at each time step (assuming resampling at every time step):
plot(fwd$diagnostics$timestamp, fwd$diagnostics$maxlp, 
     xlab = "Time stamp", ylab = "maxlp",
     type = "l")

Tip

exp(maxlp) is the highest likelihood score at each time step (0–1). Watch out for time steps with very low maxlp. For passive acoustic telemetry analyses, ModelObsAcousticTruncLogis helps to prevent this via the receiver_gamma threshold.
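For example, with hypothetical maxlp values, suspect time steps can be flagged on the likelihood scale:

```r
# exp(maxlp) maps the maximum log-posterior onto the likelihood scale (0-1);
# flag time steps where even the best particle fits the data poorly
maxlp <- c(-4.6, -4.3, -25.0, -4.4)  # hypothetical diagnostics
poor  <- which(exp(maxlp) < 1e-6)    # time step(s) with very low maxlp
```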

Particle filter

Particle filter

  • ess is the effective sample size (ESS):
plot(fwd$diagnostics$timestamp, fwd$diagnostics$ess, 
     xlab = "Time stamp", ylab = "ESS",
     type = "l")
points(det$timestamp, rep(0, nrow(det)),
       pch = 21, col = "red", bg = "red", cex = 0.5)
abline(h = 500, col = "royalblue", lty = 3)

Tip

For a 2D distribution (as we have here), the ESS should ideally be >= 500 particles to achieve a reasonable approximation. A small ESS may suggest that we need to increase the number of particles (or revise our assumptions). However, a small ESS is not necessarily problematic: there may be time steps at which only a small number of locations were possible for the individual.
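The ESS of a weighted particle set can be computed as \(1 / \sum_i w_i^2\) for normalised weights; a quick base-R illustration with made-up weights:

```r
# ESS = 1 / sum(w^2) for normalised weights: equal weights give ESS = N,
# while highly skewed weights shrink the ESS towards 1
ess <- function(w) 1 / sum(w^2)

w_equal  <- rep(1 / 1000, 1000)            # ESS = 1000
w_skewed <- c(0.91, rep(0.09 / 999, 999))  # ESS close to 1
```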

Particle filter

Particle filter

  • convergence defines whether or not the algorithm converged
fwd$convergence
[1] TRUE

Warning

Convergence failures produce a warning. These may occur for a variety of reasons including (1) code bugs, (2) prior–data conflicts, (3) incorrectly specified observation models and (4) excessive particle death. With acoustic observations alone, likely causes are 1–3. Excessive particle death (4) may occur in complex, labyrinthine landscapes, where particles have to explore a large number of possible routes of which only a tiny fraction are ultimately compatible with data.

Particle filter

  • The particle filter approximates the partial marginal distribution of the individual’s location given the data up to and including each time step, that is, \(f(\mathbf{s}_t \mid \mathbf{y}_{1:t})\)
  • If we run the particle filter forwards and backwards, we can use the two-filter smoother to account for all data, that is, \(f(\mathbf{s}_t \mid \mathbf{y}_{1:T})\)
  • In most realistic scenarios, this substantially improves maps of space use

Particle filter

  • Run the filter backwards:
# Backward run
args$.direction <- "backward"
bwd <- do.call(pf_filter, args, quote = TRUE)

Tip

Take care when specifying initial locations for the backward run: for a backward filter, ‘initial’ locations refer to the end of the timeline.

Two-filter smoother

  • Run the two-filter smoother using outputs from pf_filter() in the Julia environment:
smo <- pf_smoother_two_filter(.n_particle = 100L, .n_sim = 100L)

Note

pf_smoother_two_filter() accepts .map and .mobility arguments, but these have been replaced in patter@dev with .vmap. Use .vmap with two-dimensional states to boost speed.

Two-filter smoother

Argument Description
.n_particle The number of particles from the filter used in the algorithm
.n_sim The number of Monte Carlo simulations used to compute the normalisation constant (when required)

Note

Smoothing is expensive (\(\mathcal{O}(TN^2)\)). In this example, we use .n_particle = 100 for speed only. For real analyses, .n_particle = 1000 is generally acceptable. .n_sim = 100 is reasonable.

Note

Smoothing uses the probability density of movements between pairs of locations. The density of an unrestricted move is evaluated by Patter.logpdf_step(). For custom movement models, a Patter.logpdf_step() method is required.

Two-filter smoother

  • pf_smoother_two_filter() returns the familiar pf_particles-class object:
class(smo)
[1] "list"         "pf_particles"
summary(smo)
            Length Class      Mode   
xinit       0      -none-     NULL   
states      6      data.table list   
diagnostics 4      data.table list   
convergence 1      -none-     logical

Two-filter smoother

  • We can analyse the outputs in the same way as those from the filter:
head(smo$states)
   path_id timestep           timestamp map_value        x       y
     <int>    <int>              <POSc>     <num>    <num>   <num>
1:       1        1 2016-03-17 01:50:00  68.53316 709262.9 6253335
2:       1        2 2016-03-17 01:52:00  75.20494 709235.3 6253358
3:       1        3 2016-03-17 01:54:00  56.53801 709310.5 6253250
4:       1        4 2016-03-17 01:56:00  80.03239 709132.5 6253351
5:       1        5 2016-03-17 01:58:00  84.47292 709281.4 6253504
6:       1        6 2016-03-17 02:00:00  80.03239 709163.0 6253326

Two-filter smoother

  • diagnostics includes ESS:
head(smo$diagnostics)
   timestep           timestamp      ess maxlp
      <int>              <POSc>    <num> <num>
1:        1 2016-03-17 01:50:00      NaN   NaN
2:        2 2016-03-17 01:52:00 34.94540   NaN
3:        3 2016-03-17 01:54:00 80.75311   NaN
4:        4 2016-03-17 01:56:00 51.64853   NaN
5:        5 2016-03-17 01:58:00 42.20092   NaN
6:        6 2016-03-17 02:00:00 79.73982   NaN

Two-filter smoother

  • convergence is necessarily TRUE:
smo$convergence
[1] TRUE

Mapping

  • We can compute probability-of-use (POU) using particle samples from the filter (for comparison) or the smoother (recommended):
map_pou(.map = map, .coord = smo$states)$ud
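Conceptually, POU is the proportion of particle samples that fall in each grid cell. A base-R sketch of the idea (not map_pou's implementation), using hypothetical coordinates:

```r
# Hypothetical particle coordinates on a 10 x 10 unit study area
set.seed(1)
x <- runif(1e4, 0, 10)
y <- runif(1e4, 0, 10)

# Discretise onto a 10 x 10 grid and compute the proportion of
# samples per cell (probability-of-use)
cell <- interaction(cut(x, 0:10, include.lowest = TRUE),
                    cut(y, 0:10, include.lowest = TRUE))
pou  <- table(cell) / length(x)
```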

Mapping

Mapping

  • In practice, POU is sensitive to grid resolution and kernel smoothing may be desirable
map_dens(
  .map,
  .owin = as.owin.SpatRaster(.map),
  .coord = NULL,
  .discretise = FALSE,
  .shortcut = list(),
  ...,
  .fterra = FALSE,
  .plot = TRUE,
  .use_tryCatch = TRUE,
  .verbose = getOption("patter.verbose")
)

Mapping

  • map_dens() implements kernel smoothing via spatstat.explore::density.ppp()
    • Use sigma to choose the method to estimate the bandwidth
    • E.g., bw.diggle implements cross validation (expensive)
    • Edge-correction is automatically implemented

Warning

The default sigma in density.ppp() is calculated by a simple rule of thumb that depends only on the size of the window.

Tip

To improve speed: specify .owin, using a simplified sf window; set .discretise = TRUE; and use .fterra = TRUE. Use a simpler sigma estimator, such as the reference bandwidth estimator function(X) sqrt(0.5 * (var(X$x) + var(X$y))) * (X$n^(-1/6)).
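The reference bandwidth estimator quoted above can be tried standalone; here a plain list stands in for a spatstat ppp object, with hypothetical coordinates:

```r
# Reference (rule-of-thumb) bandwidth estimator from the tip above
bw_ref <- function(X) sqrt(0.5 * (var(X$x) + var(X$y))) * (X$n^(-1/6))

# A plain list mimics the ppp fields used (x, y, n)
set.seed(1)
X <- list(x = rnorm(500, sd = 100), y = rnorm(500, sd = 100), n = 500L)
sigma <- bw_ref(X)
```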

Mapping

  • Implement kernel smoothing with smoothed particles:
# Estimate UD
ud <- map_dens(.map = map,
               .coord = smo$states,
               sigma = bw.diggle)$ud

# Add home range
map_hr_home(ud, .add = TRUE)

Important

This map is a biologically and probabilistically sound utilisation distribution. Cell colours denote the probability density of observing an individual in a given grid cell at a randomly chosen time.

Warning

Don’t confuse particle smoothing with post-hoc kernel smoothing!

Mapping

Residency

  • We can easily compute residency in a particular area (at any time scale of our choosing) as the proportion of particles inside/outside the area:
# Define an example polygon (e.g., MPA)
spatpoly <- 
  cbind(708433.9, 6251765) |> 
  terra::vect(crs = terra::crs(map)) |> 
  terra::buffer(width = 2000)
# Visualise study area and 'MPA'
terra::plot(map)
points(smo$states$x, smo$states$y, pch = ".", col = "red")
terra::lines(spatpoly, col = "black", lwd = 2)

Residency

Residency

# Lookup whether or not particles are in the polygon
spatcoord <- terra::vect(cbind(smo$states$x, smo$states$y))
smo$states[, in_poly := 
               terra::relate(spatcoord, spatpoly, relation = "intersects")]
               
# Calculate the expected time spent in the 'MPA' (%)
table(smo$states$in_poly) / nrow(smo$states) * 100

   FALSE     TRUE 
71.76111 28.23889 

Important

This estimate of residency is probabilistically sound.

Recap

Boilerplate setup (1/7)

# Load & attach package 
library(patter)

# Load datasets
head(det, 2)
head(arc, 2)

# Connect to Julia 
julia_connect()
set_seed()

# Set map 
map <- dat_gebco()
set_map()

# Study timeline 
timeline <- assemble_timeline(.datasets = list(det, arc))

Set state (2/7)

state <- "StateXY"

Set movement model (3/7)

mobility   <- 750.0
model_move <- move_xy(dbn_length = glue("truncated(Gamma(1.0, 250.0), upper = {mobility})"),
                      dbn_angle = "Uniform(-pi, pi)")

Set observation model(s) (4/7)

# Acoustic observations 
acc <- assemble_acoustics(.timeline  = timeline,
                          .acoustics = det,
                          .moorings  = dat_moorings, 
                          .services  = NULL)

# Archival observations 
arc <- assemble_archival(.timeline = timeline, 
                         .archival = 
                           arc |> 
                             rename(obs = "depth") |> 
                             select("sensor_id", "timestamp", "obs") |>
                             mutate(depth_sigma = 50, depth_deep_eps = 20) |> 
                           as.data.table())

# All observations
yobs <- list(ModelObsAcousticLogisTrunc = acc, 
             ModelObsDepthNormalTrunc = arc)

Run particle filter (5/7)

# List filter arguments
args <- list(.map         = map,
             .timeline    = timeline,
             .state       = state,
             .xinit_pars  = list(mobility = mobility),
             .yobs        = yobs,
             .model_obs   = names(yobs),
             .model_move  = model_move,
             .n_particle  = 1e5L)

# Forward run
fwd <- do.call(pf_filter, args, quote = TRUE)

# Backward run
args$.direction <- "backward"
bwd <- do.call(pf_filter, args, quote = TRUE)

Run particle smoother (6/7)

smo <- pf_smoother_two_filter()

Analyse outputs (7/7)

map_dens(.map = map, .coord = smo$states, sigma = bw.diggle)

Simulation-based workflow

Simulation-based workflow

  • patter also supports simulation-based workflows:
    1. sim_array() simulates arrays
    2. sim_path_walk() simulates movement paths
    3. sim_observations() simulates observations
  • You can model simulated observations

Important

Simulation studies inform real-world analyses.

Simulate arrays

  • Use sim_array() to simulate randomly or regularly arranged receivers
  • Locations are sampled from .map
  • Default observation model parameters (receiver_alpha, receiver_beta, receiver_gamma) can be assigned
moorings <- sim_array(.map = map, 
                      .timeline = timeline,
                      .arrangement = "random",
                      .n_receiver = 100L, 
                      .receiver_alpha = 5, 
                      .receiver_beta = -0.01, 
                      .receiver_gamma = 1200, 
                      .n_array = 1L, 
                      .plot = TRUE)

Simulate arrays

Simulate arrays

head(moorings)
   array_id receiver_id      receiver_start        receiver_end receiver_x
      <int>       <int>              <POSc>              <POSc>      <num>
1:        1           1 2016-03-17 01:50:00 2016-03-18 01:48:00   702042.1
2:        1           2 2016-03-17 01:50:00 2016-03-18 01:48:00   709342.1
3:        1           3 2016-03-17 01:50:00 2016-03-18 01:48:00   707842.1
4:        1           4 2016-03-17 01:50:00 2016-03-18 01:48:00   711042.1
5:        1           5 2016-03-17 01:50:00 2016-03-18 01:48:00   705842.1
6:        1           6 2016-03-17 01:50:00 2016-03-18 01:48:00   708842.1
   receiver_y receiver_alpha receiver_beta receiver_gamma
        <num>          <num>         <num>          <num>
1:    6249607              5         -0.01           1200
2:    6268407              5         -0.01           1200
3:    6269107              5         -0.01           1200
4:    6267107              5         -0.01           1200
5:    6252607              5         -0.01           1200
6:    6253707              5         -0.01           1200

Simulate paths

  • We have already seen how to simulate paths
# Set state and movement model 
state      <- "StateXY"
mobility   <- 500.0
model_move <- 
  move_xy(dbn_length = glue("truncated(Normal(50, 250.0), lower = 0, upper = {mobility})"), 
          dbn_angle = "VonMises(0, 0.25)")

# Simulate path using movement model 
pp <- par(mfrow = c(1, 2))
path_coord <- sim_path_walk(.map = map, 
                            .timeline = timeline, 
                            .state = state, 
                            .n_path = 1L, 
                            .model_move = model_move)

# Fit UD using cross validation
path_ud <- map_dens(.map = map, .coord = path_coord, .discretise = TRUE)$ud
par(pp)

Simulate paths

Simulate paths

head(path_coord)
   path_id timestep           timestamp map_value        x       y
     <int>    <int>              <POSc>     <num>    <num>   <num>
1:       1        1 2016-03-17 01:50:00  54.62202 708042.1 6251707
2:       1        2 2016-03-17 01:52:00  54.17428 708113.0 6251676
3:       1        3 2016-03-17 01:54:00  50.58004 708358.7 6251598
4:       1        4 2016-03-17 01:56:00  48.12802 708078.8 6251506
5:       1        5 2016-03-17 01:58:00  53.05067 708200.9 6251690
6:       1        6 2016-03-17 02:00:00  50.70027 708233.6 6251503

Simulate observations

  • To simulate observations, we need to choose our observation models and their parameters
  • For each observation model, we need to provide parameters as a data.table that defines, for each sensor_id (e.g., receiver), the observation-model parameters associated with that sensor

Note

Time-varying observation model parameters are not currently supported in sim_observations() (but are fully supported in pf_filter()).

Simulate observations

  • Let’s set things up to simulate acoustic observations:
# ModelObsAcousticLogisTrunc 
# > We have multiple acoustic sensors
# > This data.table includes the relevant parameters
pars_ModelObsAcousticLogisTrunc <-
    moorings |>
    select(sensor_id = "receiver_id",
           "receiver_x", "receiver_y",
           "receiver_alpha", "receiver_beta", "receiver_gamma") |>
    as.data.table()

Simulate observations

  • And depth observations:
# ModelObsDepthNormalTrunc
# > We have one depth sensor
# > This data.table includes the relevant parameters 
pars_ModelObsDepthNormalTrunc <- data.frame(sensor_id = 1L,
                                            depth_sigma = 20,
                                            depth_deep_eps = 20)

Tip

Use ?ModelObs or JuliaCall::julia_help("ModelObs") for information on built-in ModelObs structures and their parameters.

Tip

Of course, we can define and use custom ModelObs sub-types. Just define the sub-type and an associated Patter.simulate_obs method. See the help for ?ModelObs for examples.

Simulate observations

  • Simulate observations along .timeline via sim_observations():
obs <- sim_observations(
  .timeline = timeline, 
  .model_obs = c("ModelObsAcousticLogisTrunc",
                 "ModelObsDepthNormalTrunc"),
  .model_obs_pars = list(pars_ModelObsAcousticLogisTrunc, 
                         pars_ModelObsDepthNormalTrunc))

Note

For each sensor, an observation is simulated for each time step. Irregular observations are not currently supported in sim_observations() (but are fully supported in pf_filter()).

Simulate observations

  • sim_observations() returns a list, with one element for every .model_obs
  • Each element is a list, with one sub-element for each simulated path
  • Each sub-element is a data.table that contains the observations

Simulate observations

str(obs)
List of 2
 $ ModelObsAcousticLogisTrunc:List of 1
  ..$ :Classes 'data.table' and 'data.frame':   72000 obs. of  8 variables:
  .. ..$ timestamp     : POSIXct[1:72000], format: "2016-03-17 01:50:00" "2016-03-17 01:50:00" ...
  .. ..$ obs           : int [1:72000] 0 0 0 0 0 0 0 0 0 0 ...
  .. ..$ sensor_id     : int [1:72000] 1 2 3 4 5 6 7 8 9 10 ...
  .. ..$ receiver_x    : num [1:72000] 702042 709342 707842 711042 705842 ...
  .. ..$ receiver_y    : num [1:72000] 6249607 6268407 6269107 6267107 6252607 ...
  .. ..$ receiver_alpha: num [1:72000] 5 5 5 5 5 5 5 5 5 5 ...
  .. ..$ receiver_beta : num [1:72000] -0.01 -0.01 -0.01 -0.01 -0.01 -0.01 -0.01 -0.01 -0.01 -0.01 ...
  .. ..$ receiver_gamma: num [1:72000] 1200 1200 1200 1200 1200 1200 1200 1200 1200 1200 ...
  .. ..- attr(*, ".internal.selfref")=<externalptr> 
 $ ModelObsDepthNormalTrunc  :List of 1
  ..$ :Classes 'data.table' and 'data.frame':   720 obs. of  5 variables:
  .. ..$ timestamp     : POSIXct[1:720], format: "2016-03-17 01:50:00" "2016-03-17 01:52:00" ...
  .. ..$ obs           : num [1:720] 46.4 14.8 28.5 42.6 61.1 ...
  .. ..$ sensor_id     : int [1:720] 1 1 1 1 1 1 1 1 1 1 ...
  .. ..$ depth_sigma   : num [1:720] 20 20 20 20 20 20 20 20 20 20 ...
  .. ..$ depth_deep_eps: num [1:720] 20 20 20 20 20 20 20 20 20 20 ...
  .. ..- attr(*, ".internal.selfref")=<externalptr> 

Simulate observations

acc <- obs$ModelObsAcousticLogisTrunc[[1]]
head(acc)
             timestamp   obs sensor_id receiver_x receiver_y receiver_alpha
                <POSc> <int>     <int>      <num>      <num>          <num>
1: 2016-03-17 01:50:00     0         1   702042.1    6249607              5
2: 2016-03-17 01:50:00     0         2   709342.1    6268407              5
3: 2016-03-17 01:50:00     0         3   707842.1    6269107              5
4: 2016-03-17 01:50:00     0         4   711042.1    6267107              5
5: 2016-03-17 01:50:00     0         5   705842.1    6252607              5
6: 2016-03-17 01:50:00     0         6   708842.1    6253707              5
   receiver_beta receiver_gamma
           <num>          <num>
1:         -0.01           1200
2:         -0.01           1200
3:         -0.01           1200
4:         -0.01           1200
5:         -0.01           1200
6:         -0.01           1200

Simulate observations

arc <- obs$ModelObsDepthNormalTrunc[[1]]
head(arc)
             timestamp      obs sensor_id depth_sigma depth_deep_eps
                <POSc>    <num>     <int>       <num>          <num>
1: 2016-03-17 01:50:00 46.42963         1          20             20
2: 2016-03-17 01:52:00 14.84555         1          20             20
3: 2016-03-17 01:54:00 28.52349         1          20             20
4: 2016-03-17 01:56:00 42.62775         1          20             20
5: 2016-03-17 01:58:00 61.07769         1          20             20
6: 2016-03-17 02:00:00 34.44895         1          20             20

Simulate observations

# Plot depths
plot(arc$timestamp, arc$obs * -1, 
     col = "royalblue", type = "l",
     ylim = c(-max(arc$obs), 0), 
     xlab = "Time stamp", ylab = "Depth (m)")
# Add detections 
det <- acc[obs == 1L, ]
points(det$timestamp, rep(0, nrow(det)), col = "red")

Simulate observations

Run algorithms

  • Now we can run the algorithms and compare the simulated pattern of space use to the reconstructed pattern:
yobs <- list(ModelObsAcousticLogisTrunc = acc,
             ModelObsDepthNormalTrunc = arc)

# List filter arguments
args <- list(.map         = map,
             .timeline    = timeline,
             .state       = state,
             .xinit_pars  = list(mobility = mobility),
             .yobs        = yobs,
             .model_obs   = names(yobs),
             .model_move  = model_move,
             .n_particle  = 5e4L, 
             .verbose     = FALSE)

# Forward run
fwd <- do.call(pf_filter, args, quote = TRUE)

# Backward run
args$.direction <- "backward"
bwd <- do.call(pf_filter, args, quote = TRUE)

# Run smoother 
smo <- pf_smoother_two_filter(.n_particle = 100)

Mapping

pp <- par(mfrow = c(1, 3))

# Plot the UD for the simulated path 
terra::plot(path_ud)
map_hr_home(path_ud, .add = TRUE)

# Plot the estimated UD 
patter_ud <- map_dens(.map = map, 
                      .coord = smo$states, 
                      .discretise = TRUE, 
                      .fterra = TRUE,
                      sigma = bw.diggle)$ud
map_hr_home(patter_ud, .add = TRUE)

# Compare to a UD based on the COA algorithm
coa_ud <- map_dens(.map = map, 
                   .coord = coa(.map = map, 
                                .acoustics = det, 
                                .moorings = moorings, 
                                .delta_t = "2 hours", 
                                .plot_weights = FALSE), 
                   .discretise = FALSE, 
                   .fterra = TRUE, 
                   sigma = bw.diggle)$ud

par(pp)

Mapping

Residency

Advanced

Parallelisation

Custom States, ModelObs and ModelMove

  • Custom state -> custom movement model -> wrapper R move_*() function:
    • StateXYZ -> ModelMoveXYZ -> move_xyz()
    • states_init.StateXYZ
  • Required Julia methods:
    • Patter.simulate_step(state::StateXYZ, model::ModelMoveXYZ, …)
    • logpdf_step(state_from::StateXYZ, state_to::StateXYZ, model_move::ModelMoveXYZ, …)
    • simulate_obs(state::StateXYZD, model::ModelObsDepthNormal, …)
    • logpdf_obs(state::State, model::ModelObsDepthNormal, …)

Development

  • HPC, Linux, etc.

Issues